fix: make heavy ML dependencies optional for lightweight installs (#57)
Merged
Move torch, torchvision, bitsandbytes, peft, and transformers from the required dependencies to `[project.optional-dependencies.training]`. Wrap all top-level imports of these packages in `try/except ImportError` so the package can be imported without them installed.

This unblocks lightweight consumers (e.g. the Wright worker installing openadapt-evals) that don't need local model training/inference. Users who need training can install with:

`pip install openadapt-ml[training]`

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Summary
- Heavy ML dependencies moved to `[project.optional-dependencies.training]`
- `try/except ImportError` guards with clear error messages when users try to instantiate training classes without them
- `training/grpo/__init__.py` lazy-loads the torch-dependent `GRPOTrainer` module

Motivation
The Wright worker needs to install openadapt-evals (which depends on openadapt-ml), but doesn't need local model training. The current required deps pull in torch (873MB), torchvision, bitsandbytes, peft, and transformers, making the install massive and slow for lightweight consumers.
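A `pyproject.toml` layout along these lines would implement the split; the exact dependency lists and any version pins below are illustrative, not copied from the repository:

```toml
# Hypothetical sketch of the dependency split in pyproject.toml.
[project]
name = "openadapt-ml"
dependencies = [
    "anthropic",
    "click",
    "pillow",
    "pydantic-settings",
]

[project.optional-dependencies]
training = [
    "torch",
    "torchvision",
    "bitsandbytes",
    "peft",
    "transformers",
]
```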
After this change:
- `pip install openadapt-ml` installs only lightweight deps (anthropic, click, pillow, pydantic-settings, etc.)
- `pip install openadapt-ml[training]` installs everything needed for local model training/inference
- `import openadapt_ml` works without torch installed
- Instantiating `BaseVLMAdapter`, `QwenVLAdapter`, `DummyAdapter`, or `NextActionDataset` without torch raises a clear `ImportError` with install instructions

Test plan
- `python -c "import openadapt_ml"` works without torch
- `python -c "from openadapt_ml.datasets.next_action import build_next_action_sft_samples"` works without torch
- `pip install openadapt-ml[training]` pulls in torch and training works as before

🤖 Generated with Claude Code